Moore's law was originally about how many transistors fit on a given area. In practice it has, among other things, been interpreted to mean that the computing power of computers doubles every eighteen months. Now researchers at Stanford University have shown that the same holds for power consumption:
The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half. (Jonathan Koomey, to Technology Review.)
Update: Via Staffan Malmgren I received a link to The Atlantic, which has run some numbers on what this means:
Imagine you've got a shiny computer that is identical to a Macbook Air, except that it has the energy efficiency of a machine from 20 years ago. That computer would use so much power that you'd get a mere 2.5 seconds of battery life out of the Air's 50 watt-hour battery instead of the seven hours that the Air actually gets. That is to say, you'd need 10,000 Air batteries to run our hypothetical machine for seven hours. There's no way you'd fit a beast like that into a slim mailing envelope. (The Atlantic.)
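A quick back-of-the-envelope check of those figures, as a minimal sketch: assuming efficiency doubles every 1.5 years over 20 years, and taking the 50 watt-hour battery and seven-hour runtime straight from the quote:

```python
# Sanity check of The Atlantic's numbers, assuming efficiency doubles
# every 1.5 years (Koomey's law) over a 20-year span.

years = 20
doubling_period = 1.5                      # years per doubling of efficiency
factor = 2 ** (years / doubling_period)    # ~10,000x more energy needed per task

air_runtime_s = 7 * 3600                   # 7 hours, in seconds, on a 50 Wh battery
old_runtime_s = air_runtime_s / factor     # runtime with 20-year-old efficiency

print(f"Efficiency gap over {years} years: {factor:,.0f}x")
print(f"Runtime on one 50 Wh battery: {old_runtime_s:.1f} seconds")
print(f"Batteries needed for 7 hours: {factor:,.0f}")
```

Running this gives roughly a 10,000-fold gap, about 2.4 seconds of runtime, and on the order of 10,000 batteries, which matches the figures in the quote.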